2 research outputs found
Social Bootstrapping: How Pinterest and Last.fm Social Communities Benefit by Borrowing Links from Facebook
How does one develop a new online community that is highly engaging to each
user and promotes social interaction? A number of websites offer friend-finding
features that help users bootstrap social networks on the website by copying
links from an established network like Facebook or Twitter. This paper
quantifies the extent to which such social bootstrapping is effective in
enhancing the social experience of the website. First, we develop a stylised
analytical model that suggests that copying tends to produce a giant connected
component (i.e., a connected community) quickly and preserves properties such
as reciprocity and clustering, up to a linear multiplicative factor. Second, we
use data from two websites, Pinterest and Last.fm, to empirically compare the
subgraph of links copied from Facebook to the subgraph of links created
natively. We find that the copied subgraph has a giant component and higher
reciprocity and clustering, and we confirm that copied connections see more
social interaction.
However, the need for copying diminishes as users become more active and
influential. Such users tend to create links natively on the website,
connecting to users who are more similar to them than their Facebook friends
are. Our findings give new insight into how bootstrapping from established
social networks can help engage new users by enhancing social interactivity.
Comment: Proc. 23rd International World Wide Web Conference (WWW), 2014
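The kind of comparison described in this abstract can be illustrated with a short sketch. The snippet below is a minimal, hypothetical example (not the paper's code): it computes the three properties the abstract highlights (giant-component size, reciprocity, clustering) for a "copied" and a "native" subgraph using networkx. The edge lists and node labels are placeholder assumptions.

```python
# Minimal sketch: compare a subgraph of links copied from Facebook with a
# subgraph of natively created links on giant-component size, reciprocity,
# and clustering. Edge lists here are hypothetical placeholders.
import networkx as nx

def subgraph_stats(edges):
    """Return (giant-component fraction, reciprocity, average clustering)."""
    g = nx.DiGraph()
    g.add_edges_from(edges)
    und = g.to_undirected()
    giant = max(nx.connected_components(und), key=len)
    giant_frac = len(giant) / und.number_of_nodes()
    reciprocity = nx.reciprocity(g)          # fraction of mutual directed links
    clustering = nx.average_clustering(und)  # average clustering coefficient
    return giant_frac, reciprocity, clustering

# Hypothetical (follower, followee) pairs, split by link origin.
copied_edges = [("a", "b"), ("b", "a"), ("b", "c"), ("c", "a")]
native_edges = [("a", "d"), ("d", "e"), ("e", "f")]

for name, edges in [("copied", copied_edges), ("native", native_edges)]:
    giant, recip, clust = subgraph_stats(edges)
    print(f"{name}: giant={giant:.2f} reciprocity={recip:.2f} clustering={clust:.2f}")
```

On real data, the same three statistics would be computed over the full copied and native subgraphs rather than toy edge lists.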
Generalized Multiple Intent Conditioned Slot Filling
Natural language understanding includes the tasks of intent detection
(identifying a user's objectives) and slot filling (extracting the entities
relevant to those objectives). Prior slot filling methods assume that each
intent type cannot occur more than once within a message, however this is often
not a valid assumption for real-world settings. In this work, we generalize
slot filling by removing the constraint of unique intents in a message. We cast
this as a JSON generation task and approach it using a language model. We
create a pre-training dataset by combining DBpedia and existing slot filling
datasets that we convert for JSON generation. We also generate an in-domain
dataset using GPT-3. We train T5 models for this task (with and without
exemplars in the prompt) and find that both training datasets improve
performance, and that the model is able to generalize to intent types not seen
during training.
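The task framing (slot filling as JSON generation with a seq2seq model) can be sketched as follows. This is an illustrative assumption, not the paper's released code: the "t5-base" checkpoint is a stand-in for the fine-tuned models the paper trains, and the prompt format, exemplar, and JSON schema are made up for demonstration. An off-the-shelf checkpoint would not emit well-formed JSON without the described fine-tuning.

```python
# Minimal sketch: cast multi-intent slot filling as JSON generation with a
# seq2seq model. Checkpoint, prompt format, and schema are illustrative.
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

# One exemplar in the prompt (the paper trains variants with and without
# exemplars); the target is a JSON list, so the same intent type may repeat.
prompt = (
    "Extract intents and slots as JSON.\n"
    "Example: 'Book a table for two and a table for six tonight' -> "
    '[{"intent": "book_table", "party_size": "two", "time": "tonight"}, '
    '{"intent": "book_table", "party_size": "six", "time": "tonight"}]\n'
    "Input: 'Play some jazz and play some blues after that' -> "
)

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the output is a JSON list rather than per-token slot labels, a message can contain several instances of the same intent, which is the generalization the abstract describes.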